
    A novel tactile display for softness and texture rendering in tele-operation tasks

    Softness and high-frequency texture information represent fundamental haptic properties for everyday activities and for tactual exploration of the environment. While several displays have been produced to convey either softness or high-frequency information, there is little or no evidence of systems able to reproduce both properties in an integrated fashion. This aspect is especially crucial in medical tele-operated procedures, where the roughness and stiffness of human tissues are both important for correctly identifying pathologies through palpation (e.g. in tele-dermatology). This work presents a fabric yielding display (FYD-pad), a fabric-based tactile display for softness and texture rendering. The system controls two motors both to modify the stretching state of the elastic fabric for softness rendering and to convey texture information derived from accelerometer data. At the same time, the measurement of the contact area can be used to control remote or virtual robots. In this paper, we discuss the architecture of FYD-pad and the techniques used for softness and texture reproduction, as well as for synthesizing probe-surface interactions from real data. Tele-operation examples and preliminary experiments with humans are reported, which show the effectiveness of the device in delivering both softness and texture information.
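
    The abstract describes the two rendering channels only at a high level. Purely as a hypothetical sketch of how they could be combined, the Python snippet below maps a target stiffness to a fabric-stretch setpoint for the two motors and superimposes a high-frequency modulation derived from a recorded acceleration trace. All constants, units, and function names are assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical calibration constants (not from the paper): fabric pre-stretch
# (motor angle, rad) grows with the stiffness to be rendered.
STRETCH_MIN, STRETCH_MAX = 0.0, 1.2      # motor angle range [rad], assumed
STIFF_MIN, STIFF_MAX = 0.1, 2.0          # renderable stiffness range [N/mm], assumed

def stiffness_to_stretch(k):
    """Map a target stiffness to a symmetric stretch setpoint for the two motors."""
    k = np.clip(k, STIFF_MIN, STIFF_MAX)
    alpha = (k - STIFF_MIN) / (STIFF_MAX - STIFF_MIN)
    return STRETCH_MIN + alpha * (STRETCH_MAX - STRETCH_MIN)

def texture_modulation(accel_trace, gain=0.05):
    """Turn a recorded acceleration trace [m/s^2] into a small high-frequency
    modulation to superimpose on the stretch setpoint."""
    accel = np.asarray(accel_trace, dtype=float)
    accel = accel - accel.mean()         # keep only the vibratory component
    return gain * accel                  # scaled to motor-angle units

if __name__ == "__main__":
    fs = 1000.0                          # accelerometer sampling rate [Hz], assumed
    t = np.arange(0.0, 0.5, 1.0 / fs)
    accel_trace = 0.8 * np.sin(2 * np.pi * 150.0 * t)   # stand-in for real probe data
    base = stiffness_to_stretch(0.8)
    cmd = base + texture_modulation(accel_trace)
    print(f"stretch setpoint {base:.3f} rad, command range "
          f"[{cmd.min():.3f}, {cmd.max():.3f}] rad")
```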

    Design of an under-actuated wrist based on adaptive synergies

    An effective robotic wrist is a key enabling element in robotic manipulation, especially in prosthetics. In this paper, we propose an under-actuated wrist system that is also adaptable and allows different under-actuation schemes to be implemented. Our approach leverages the idea of soft synergies, and in particular the design method of adaptive synergies, as derived from the field of robot hand design. First, we introduce the design principle and its implementation and function in a configurable test-bench prototype, which can be used to demonstrate the feasibility of our idea. Furthermore, we report results from preliminary experiments with humans, aiming to identify the most probable wrist pose during the pre-grasp phase in activities of daily living. Based on these outcomes, we calibrate our wrist prototype accordingly and demonstrate its effectiveness in accomplishing grasping and manipulation tasks.
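
    The adaptive-synergy idea mentioned above can be illustrated with a toy numerical example: a few actuators drive many elastic joints through a synergy matrix, so the posture adapts under external loads. The sketch below is a minimal, hypothetical formulation; the synergy matrix, stiffness values, and notation are assumed, not taken from the paper.

```python
import numpy as np

# Minimal sketch of a soft-synergy under-actuation scheme (notation assumed):
# n wrist joints q are driven by m < n actuators sigma through a synergy matrix S,
# via joint springs of stiffness K. External torques tau_ext (e.g. from contact)
# deflect the joints away from the synergy reference S @ sigma.
S = np.array([[1.0], [0.6], [0.3]])      # synergy matrix, 3 joints x 1 actuator (assumed)
K = np.diag([2.0, 2.0, 1.5])             # joint stiffness [Nm/rad] (assumed)

def wrist_equilibrium(sigma, tau_ext):
    """Joint configuration at static equilibrium: K (q - S sigma) = tau_ext."""
    q_ref = S @ sigma                    # reference posture set by the actuator
    return q_ref + np.linalg.solve(K, tau_ext)

sigma = np.array([0.4])                  # single actuator input [rad]
free = wrist_equilibrium(sigma, np.zeros(3))
loaded = wrist_equilibrium(sigma, np.array([0.0, -0.3, 0.1]))
print("free-motion posture:", np.round(free, 3))
print("posture adapted to external load:", np.round(loaded, 3))
```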

    LHF Connect: a DIY telepresence robot against COVID-19

    This contribution describes a case study of a “do-it-yourself” (DIY) open-source service, and the related product, designed to help combat the COVID-19 emergency. It illustrates the birth of LHF Connect, a project conceived to facilitate communication between patients isolated in COVID-19 hospital wards and their relatives. LHF Connect is a teleoperated robot that can move autonomously around the hospital. A User-Centered Design approach, with its methods and specific tools, helped manage crucial steps of the design process, such as i) collecting needs from the context, stakeholders, and end users; ii) defining the service blueprint; iii) envisioning the final concepts; and iv) managing the communication activities. The initiative was promoted by a multidisciplinary team of researchers (mainly roboticists, supported by specific competences from the design discipline).

    Integration of a Passive Exoskeleton and a Robotic Supernumerary Finger for Grasping Compensation in Chronic Stroke Patients: The SoftPro Wearable System

    Upper-limb impairments pervasively affect Activities of Daily Living (ADLs). As a consequence, people affected by a loss of arm function must endure severe limitations. To compensate for the lack of a functional arm and hand, we developed a wearable system that combines different assistive technologies, including sensing, haptics, orthotics, and robotics. The result is a device that helps lift the forearm by means of a passive exoskeleton and improves the grasping ability of the impaired hand by employing a wearable robotic supernumerary finger. A pilot study involving three patients, conducted to test the device's capability to assist in performing ADLs, confirmed its usefulness and serves as a first step in the investigation of novel paradigms for robotic assistance.

    A novel tactile display for softness and texture rendering in human-robot and tele-operation applications

    The aim of this thesis is to design and engineer a tactile display for softness and texture rendering, to be used as a master device in human-robot and teleoperation applications. In recent years, many teleoperation approaches, i.e. the control of a machine at a distance, have been developed using haptic interfaces to control a simulated or remote manipulator. Among the different haptic properties to be rendered, softness plays a crucial role in guaranteeing an effective interaction with a remote environment. Furthermore, haptics research has devoted considerable effort to understanding and recreating high-frequency texture information to improve the quality of haptic feedback in both real and virtual environments. To the best of our knowledge, reproducing both these types of information in a haptic device for teleoperation tasks is still an unexplored topic. In this thesis, we designed and fabricated a novel fabric-based device, hereinafter referred to as the FYD Touchpad, that can be used to tele-operate a remote robot (through tracking of the contact area) and to convey haptic stimuli. More specifically, the system performs digital texture rendering through Pulse Width Modulation of two DC motors. Softness information is conveyed by modulating the stretching state of the fabric, while the dynamic movement of the user's finger on the elastic fabric allows a robot connected over a network to be controlled remotely, by tracking the location of the contact area on the fabric. At the same time, the user experiences the texture and softness information that the robot end effector is sensing at the remote side. The device can easily be interfaced with different manipulators, since it communicates with external robots using a packet-exchange protocol over both wired and wireless networks. The haptic interface was tested using a KUKA 7-DoF manipulator for remote control. Several objects with different combinations of softness and roughness properties were remotely explored, and their properties were then reproduced by the FYD Touchpad. Experimental results on the correlation between the signals on the master and slave sides show the effectiveness of the proposed system and techniques.
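
    The thesis mentions a packet-exchange protocol between the display and external robots, but the summary above gives no format details. Purely as an illustration, the sketch below assumes a simple UDP exchange in which the master sends the tracked contact-area centroid and receives stiffness and acceleration data measured at the slave side; all field layouts, ports, and addresses are hypothetical, not the thesis' actual protocol.

```python
import socket
import struct

# Hypothetical packet layout (not the actual FYD Touchpad protocol):
# master -> slave: contact-area centroid (x, y) on the fabric, normalized to [0, 1]
# slave  -> master: estimated stiffness [N/mm] and one acceleration sample [m/s^2]
MASTER_FMT = "!2f"                        # network byte order, two floats
SLAVE_FMT = "!2f"
SLAVE_ADDR = ("192.168.0.10", 5005)       # placeholder address of the robot-side node

def send_contact_centroid(sock, cx, cy):
    """Send the finger contact-area centroid tracked on the elastic fabric."""
    sock.sendto(struct.pack(MASTER_FMT, cx, cy), SLAVE_ADDR)

def receive_haptic_feedback(sock):
    """Receive stiffness and acceleration measured by the remote end effector."""
    data, _ = sock.recvfrom(struct.calcsize(SLAVE_FMT))
    stiffness, accel = struct.unpack(SLAVE_FMT, data)
    return stiffness, accel

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(0.01)                 # keep the master loop near 100 Hz
    send_contact_centroid(sock, 0.42, 0.61)
    try:
        k, a = receive_haptic_feedback(sock)
        print(f"render stiffness {k:.2f} N/mm, texture sample {a:.2f} m/s^2")
    except socket.timeout:
        pass                              # no feedback packet this cycle
```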

    A Wearable Fabric-based display for haptic multi-cue delivery

    Softness is one of the most informative haptic properties and plays a fundamental role in both everyday tasks and more complex procedures. It is therefore not surprising that much effort has been devoted to designing haptic systems able to suitably reproduce this information. At the same time, wearability has gained increasing importance as a novel paradigm for more effective and naturalistic human-robot interaction. Capitalizing on our previous work on grounded softness devices, in this paper we present the Wearable Fabric Yielding Display (W-FYD), a fabric-based tactile display for multi-cue delivery that can be worn on the user's finger. W-FYD enables both passive and active tactile exploration. Different levels of stiffness can be reproduced by modulating the stretching state of a fabric through two DC motors. An additional vertical degree of freedom is implemented through a lifting mechanism, which allows softness stimuli to be conveyed to the user's finger pad. Furthermore, a sliding effect on the finger can also be induced. Experiments with humans show the effectiveness of W-FYD for haptic multi-cue delivery.
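
    As a purely illustrative reading of the two exploration modes described above, the sketch below separates a passive mode, where the lifting degree of freedom indents the fabric into a still finger, from an active mode, where the user presses and the motors hold the stretch corresponding to the target stiffness. Gains, units, and interfaces are assumptions, not the W-FYD firmware.

```python
import numpy as np

# Illustrative controller skeleton for the two exploration modes (all parameters assumed).
STRETCH_PER_STIFFNESS = 0.9       # motor stretch [rad] per unit stiffness [N/mm], assumed

def stretch_for_stiffness(k_target):
    """Set both DC motors so the fabric stretch reflects the target stiffness."""
    return STRETCH_PER_STIFFNESS * k_target

def passive_mode(k_target, indentation_profile_mm):
    """Passive exploration: the finger is still; the lifting DoF pushes the fabric
    into the finger pad following a commanded indentation profile."""
    stretch = stretch_for_stiffness(k_target)
    return [(stretch, z) for z in indentation_profile_mm]   # (motor, lift) setpoints

def active_mode(k_target, measured_indentation_mm):
    """Active exploration: the user presses; the expected reaction force is
    k * delta, while the motors hold the corresponding stretch."""
    return stretch_for_stiffness(k_target), k_target * measured_indentation_mm

profile = np.linspace(0.0, 2.0, 5)        # 0 -> 2 mm indentation ramp
print(passive_mode(0.5, profile))
print(active_mode(0.5, measured_indentation_mm=1.2))
```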

    An Integrated Approach to Characterize the Behavior of a Human Fingertip in Contact with a Silica Window

    Understanding the mechanisms of human tactual perception is a challenging task in haptics and humanoid robotics. A classic approach to this issue is to accurately and exhaustively characterize the mechanical behaviour of the human fingertip. The output of this characterization can then be exploited to drive the design of numerical models, which can be used to investigate the mechanisms of human sensing in depth. In this work, we present a novel integrated measurement technique and experimental setup for in vivo characterization of the human fingertip at contact, in terms of contact area, force, deformation, and pressure distribution. The device presented here compresses the participant's fingertip against a flat surface while the aforementioned measurements are acquired and experimental parameters such as velocity, finger orientation, and displacement (indentation) are controlled. Experimental outcomes are then compared and integrated with the output of a 3D finite element (FE) model of the human fingertip, built upon existing validated models. The agreement between numerical and experimental data validates our approach.
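
    A typical way to quantify the agreement between experimental curves and FE predictions is a simple error metric over matched indentation levels. The snippet below shows one such comparison on invented placeholder numbers; it does not use data from the study.

```python
import numpy as np

# Illustration only: comparing measured and FE-predicted contact area as a
# function of indentation depth. All values are invented placeholders.
indentation_mm = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
area_measured_mm2 = np.array([18.0, 41.0, 66.0, 88.0, 105.0])
area_fe_mm2 = np.array([20.0, 43.0, 63.0, 85.0, 109.0])

def rmse(a, b):
    """Root-mean-square error between two matched curves."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

def mean_relative_error(a, b):
    """Mean absolute error relative to the measured values."""
    return float(np.mean(np.abs(a - b) / a))

print(f"RMSE: {rmse(area_measured_mm2, area_fe_mm2):.1f} mm^2")
print(f"mean relative error: {100 * mean_relative_error(area_measured_mm2, area_fe_mm2):.1f}%")
```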

    Touch-Based Grasp Primitives for Soft Hands: Applications to Human-to-Robot Handover Tasks and Beyond

    Recently, the advent of adaptable, soft robotic hands has opened up simpler ways to grasp different items; however, the potential of soft end effectors (SEEs) is still largely unexplored, especially in human-robot interaction. In this paper, we propose, for the first time, a simple touch-based approach to endow a SEE with autonomous grasp sensory-motor primitives, in response to an item passed to the robot by a human (human-to-robot handover). We capitalize on human inspiration and minimalistic sensing, while hand adaptability is exploited to generalize the grasp response to different objects. We consider the Pisa/IIT SoftHand (SH), an under-actuated soft anthropomorphic robotic hand, mounted on a robotic arm and equipped with Inertial Measurement Units (IMUs) on the fingertips. These sensors detect the accelerations arising from contact with external items. In response to a contact, the hand pose and closure are planned for grasping, by executing arm motions together with hand-closure commands. We generate these motions from human wrist poses acquired while a human maneuvers the SH to grasp an object from a table. We obtained an 86% grasp success rate over many objects passed to the SH in different manners. We also tested our techniques in preliminary experiments in which the robot moved to autonomously grasp objects from a surface. The results are positive and open interesting perspectives for soft robotic manipulation.
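
    In the simplest reading, the touch-triggered behaviour described above reduces to thresholding the fingertip IMU signal and launching a pre-recorded arm/hand primitive. The sketch below illustrates that idea; the threshold, the primitive waypoints, and the robot interface are assumptions rather than the paper's implementation.

```python
import numpy as np

# Sketch of a touch-triggered grasp primitive (all numbers and interfaces assumed).
CONTACT_THRESHOLD = 2.5        # acceleration magnitude [g] that counts as a touch

def detect_contact(imu_samples):
    """Return the index of the fingertip IMU whose acceleration magnitude
    exceeds the threshold, or None if no contact is detected."""
    for i, sample in enumerate(imu_samples):
        if np.linalg.norm(sample) > CONTACT_THRESHOLD:
            return i
    return None

def grasp_primitive(finger_idx):
    """A toy primitive: (wrist pose offset, hand closure) waypoints loosely
    imitating a human-recorded approach-and-close motion."""
    retreat = {"dz": +0.05, "closure": 0.0}   # back off 5 cm, hand open
    approach = {"dz": -0.03, "closure": 0.3}  # move toward the offered object
    close = {"dz": 0.0, "closure": 1.0}       # full hand closure
    return [retreat, approach, close]

samples = [np.array([0.1, 0.2, 0.9]),          # finger 0: no contact
           np.array([1.8, 2.1, 0.4])]          # finger 1: contact-like spike
idx = detect_contact(samples)
if idx is not None:
    for waypoint in grasp_primitive(idx):
        print(f"finger {idx} touched -> command {waypoint}")
```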

    Humanoids at Work: The WALK-MAN Robot in a Postearthquake Scenario

    Today, human intervention is the only effective course of action after a natural or man-made disaster. This is true both for relief operations, where search and rescue of survivors is the priority, and for subsequent activities, such as those devoted to building assessment. In these contexts, the use of robotic systems would be beneficial to drastically reduce operators' risk exposure. However, the readiness level of robots still prevents their effective exploitation in relief operations, which are highly critical and characterized by severe time constraints. By contrast, current robotic technologies can be profitably applied in procedures such as building assessment after an earthquake. To date, these operations are carried out by engineers and architects who inspect numerous buildings over a large territory, at a high cost in terms of time and resources and with a high risk due to aftershocks. The main idea is to have the robot act as an alter ego of the human operator, who teleoperates it thanks to a virtual-reality device and a body-tracking system based on inertial sensors.
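
    As a very rough sketch of the alter-ego teleoperation loop mentioned in the last sentence, the snippet below clamps operator joint angles coming from an inertial body-tracking suit to robot joint limits before issuing them as commands. Joint names, limits, and the tracking interface are assumptions; the actual WALK-MAN pipeline is considerably more involved.

```python
import numpy as np

# Purely illustrative retargeting step for inertial body tracking -> robot commands.
ROBOT_JOINT_LIMITS = {           # [rad], assumed
    "shoulder_pitch": (-2.0, 2.0),
    "elbow": (0.0, 2.4),
}

def retarget(operator_angles):
    """Clamp operator joint angles (from the inertial suit) to robot limits."""
    cmd = {}
    for joint, angle in operator_angles.items():
        lo, hi = ROBOT_JOINT_LIMITS[joint]
        cmd[joint] = float(np.clip(angle, lo, hi))
    return cmd

tracked = {"shoulder_pitch": -0.7, "elbow": 2.9}   # one body-tracking sample
print(retarget(tracked))     # the elbow request exceeds the limit and gets clamped
```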